# Japanese Whole Word Masking BERT
Bert Base Japanese V2
A BERT model pretrained on Japanese Wikipedia that uses the Unidic dictionary for word-level tokenization and applies whole word masking during pretraining.
Tags: Large Language Model, Japanese
Publisher: tohoku-nlp
Downloads: 12.59k
Likes: 26
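
Below is a minimal sketch of loading the model for masked-token prediction with Hugging Face Transformers. The repository id `tohoku-nlp/bert-base-japanese-v2` is an assumption based on the publisher and model name shown above, and the word-level tokenizer is assumed to require the `fugashi` and `unidic-lite` packages for Unidic-based tokenization.

```python
# Minimal sketch: masked-token prediction with Bert Base Japanese V2.
# Assumes the Hugging Face repo id "tohoku-nlp/bert-base-japanese-v2";
# the Unidic word-level tokenizer needs `pip install fugashi unidic-lite`.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "tohoku-nlp/bert-base-japanese-v2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Mask one word in a Japanese sentence and let the model fill it in.
text = "東北大学で自然言語処理を[MASK]しています。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the top-5 predicted tokens for it.
mask_positions = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_positions].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```

Because whole word masking hides all subword pieces of a word at once during pretraining, the model is trained to reconstruct complete Japanese words rather than isolated subword fragments, which is what the fill-mask usage above exercises.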